JAVA-2563 and JAVA-2564 #20

Merged
merged 65 commits into datastax:master on Mar 24, 2020

Conversation

absurdfarce
Collaborator

No description provided.

… used in future-handling code) to a generic callback trait
Big chunks of this are based on/copied wholesale from previous work done by Arthur Landim (datastax#17)
…propagates through the statement/attribute level but doesn't make it up to the action level (since the actions don't seem to care about the underlying Statement type they're executing... or at least they don't so far).
… replaced with accessor for coordinator node. Also cleaned up a fair amount of DseResponse.
…in these changes. Still need to update the bits around the explicit interaction with the driver.
…RequestAction... should make supporting the setting of various options from DseCqlAttributes considerably easier.
@absurdfarce absurdfarce added the wip label Dec 4, 2019
…er than statements specifically. Facilitates conversion to builders generally.

Also converted CQL statements to use builders in order to address immutable statement classes
…er now. Still need to do the same for graph.

Some things have moved into driver configuration, most notably the retry policy.

Also simplified the DseResponse API.  Primary accessor here is for the (Graph|Async)ResultSet... these vals can then
be converted to Seqs using ResultSetUtils and managed via Scala's collection API.  Preserved as many of the existing
checks as possible mainly for backward compatibility; going forward many users would migrate to a Seq-based impl.

The DseResponse API changes were (largely) a result of the distinction between ResultSet and AsyncResultSet in the 4.x
driver.
@@ -23,32 +26,38 @@ import io.gatling.core.check.Check
* @param cl Consistency Level to be used
* @param cqlChecks Data-level checks to be run after response is returned
* @param genericChecks Low-level checks to be run after response is returned
* @param userOrRole User or role to be used when proxy auth is enabled

Out of curiosity, what is the recommended way to set up proxy authentication now?

Collaborator Author

Presumably like the retry policy this has now moved to the config... ?

Contributor

Not exactly, I am afraid.

What you are talking about (the config) is when one wants to send all the queries using a proxied user. It has indeed moved to configuration, as described in the Proxy Authentication section of the driver docs.

This part here is different. The goal is to give the user per-query granularity. See for instance this RLAC benchmark. We connect to DSE with a proxyauth user. Then, a single Gatling scenario simulates that this proxyauth user sends queries on behalf of tens of different users, given that it has been granted PROXY.EXECUTE beforehand. The goal of this scenario is to figure out whether proxy execution has an overhead.

Think of this as a multi-tenant application, where you want the data of each tenant to not be accessible to anyone else. You can have a single driver connection, and yet have strict data visibility rules.

Collaborator Author

Thanks for the clarification @pingtimeout . In a moment of synchronicity I'm also working on porting dse-benchmarks over to use the new driver (see https://github.com/riptano/dse-benchmarks/pull/230/files) and I just happened to hit the RLAC benchmarks just after reading this. T'is a curious universe, this one.

So... I think the best (only?) mechanism we have for addressing this is exposing the per-request ExecutionProfile settings and requiring users to define profiles for each user role they'll be requesting. Does that sound right to you @adutra or is there an easier mechanism I'm not aware of?

Collaborator Author

Hang on, I think I can handle this via the ProxyAuthentication class as described in https://docs.datastax.com/en/developer/java-driver/4.4/manual/core/authentication/#proxy-execution. Working on that now...
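For reference, here's a minimal sketch of the per-request usage I have in mind (the keyspace, query, and role name are placeholders, not plugin code):

```scala
import com.datastax.dse.driver.api.core.auth.ProxyAuthentication
import com.datastax.oss.driver.api.core.CqlSession
import com.datastax.oss.driver.api.core.cql.SimpleStatement

// Sketch only: wrap an individual statement so the driver executes it on
// behalf of another role (the connected user must hold PROXY.EXECUTE).
def runAsTenant(session: CqlSession, tenantRole: String): Unit = {
  val stmt = SimpleStatement.newInstance("SELECT * FROM tenant_ks.orders")
  session.execute(ProxyAuthentication.executeAs(tenantRole, stmt))
}
```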

Collaborator Author

Okay, I think that resolved it. @adutra and/or @olim7t would one/both of you mind taking a look at cca2ef9 and confirming that this usage matches what is expected?


Yes, that's the way ProxyAuthentication is meant to be used. Glad that you didn't have to expose execution-profiles, at least not for now.

@@ -6,7 +6,15 @@

package com.datastax.gatling.plugin.model

import com.datastax.driver.core.{PreparedStatement, SimpleStatement}
import com.datastax.oss.driver.api.core.cql.{
BatchStatement => BatchS,

Open question: would it help to name driver types with a common prefix/suffix to make them more easily distinguishable? E.g. DriverBatchStmt, DriverBatchBuilder...

Collaborator Author

Personally I don't think so; I think the names as we have them are distinct enough while still having the advantage of brevity.


Why even alias them? They don't clash with any project types.

This makes the code harder to read. Granted, it's not a big deal if you read the file top to bottom, but consider other common situations: jumping to a method definition from another file, reading a diff that might not include the imports, etc.

Brevity doesn't matter, you can use IDE completion.

Collaborator Author

I disagree; I'd argue brevity does matter for reading comprehension. The repetition of "Statement" here is just noise.

@olim7t olim7t Jan 30, 2020

It's not noise, it's part of the name that precisely identifies the driver type. A name that anyone from the team instantly recognizes, even if they've not worked on the gatling plugin before.

Yes, I can read the code with the aliases, but "SimpleStatementBuilder => SimpleB" is just another piece of information that I have to keep in my head. If I forget, I have to go check the imports (assuming I'm reading the full file, not a diff), or maybe my IDE can show me a tooltip (assuming I am in my IDE, not on GitHub or patching something with vi on a CI box).

Things could get worse if someone else uses their own aliases in another project (or even different parts of the same project), why not BoS and BoB. Now I have to remember a different "dialect" depending on what I'm looking at. Or maybe they use SimpleB for something else (javax.script.SimpleBindings?). If I don't have an IDE available and need to grep the codebase for every reference to SimpleStatementBuilder, it gets hairy...

Renaming clauses have their use when the type you want to import clashes with something that you already have in scope:

// Assuming we also intend to use scala.collection.immutable.Map
import java.util.{Map => JavaMap}

That's not the case here, so why introduce an unnecessary level of indirection?


I agree with @olim7t , the Statement part of the type name doesn't seem to be a big problem - the name is not very long and we are using those types everywhere.

Collaborator Author

I completely disagree, but in the spirit of trying to get this wrapped up I'll make the change. For me this is nearly identical to imports of static methods which can go a long way towards reducing the verbiage in a given class. The question of viewing this code in a diff feels like something of a canard to me as diffs always have limited context by their very nature.

trait DseCqlStatement extends DseStatement[Statement] {
def buildFromSession(session: Session): Validation[Statement]
}
trait DseCqlStatement[T <: Statement[T], B <: StatementBuilder[B,T]] extends DseStatement[B]
Collaborator Author

Note that we change the return value of DseStatement.buildFromSession() here. In prior versions this would return a Statement subclass but as of now it returns a StatementBuilder subclass. This change facilitates support for immutable statement objects in 2.x; the request Action impl (in this case CqlRequestAction) can operate on a single builder rather than producing lots of intermediate statement objects, one for each defined attribute in DseCqlAttributes.
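To make that concrete, here is a rough sketch of the pattern (not the literal CqlRequestAction code; the attribute names are illustrative):

```scala
import com.datastax.oss.driver.api.core.cql.{Statement, StatementBuilder}

// Sketch: fold optional attributes into one mutable builder, then build a
// single immutable statement, instead of creating a new statement per attribute.
def buildStatement[T <: Statement[T], B <: StatementBuilder[B, T]](
    builder: B,
    enableTrace: Option[Boolean]): T = {
  enableTrace.foreach(builder.setTracing)
  // ... the other DseCqlAttributes fields are folded in the same way ...
  builder.build()
}
```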

trait DseCqlStatement extends DseStatement[Statement] {
def buildFromSession(session: Session): Validation[Statement]
}
trait DseCqlStatement[T <: Statement[T], B <: StatementBuilder[B,T]] extends DseStatement[B]
Collaborator Author

Also wanted to be much more explicit about the type constraints here, including the recursive self-types and the relationship between statement and statement builder types. This last part became more important once we settled on returning a builder from buildFromSession().

@olim7t olim7t Jan 30, 2020

💯 to use builders.

As for the type parameters, I find them frustrating because they propagate everywhere in the codebase, but we never end up using them anywhere: everything we call on a B is defined on the parent StatementBuilder type, and everything we do with a T could be done with a Statement[_] (session.executeAsync, pattern-matching when we extract the query, etc).

It all boils down to that annoying SelfT on StatementBuilder. But note that we don't use that "self-type" either: the builder is only invoked in a mutable fashion:

dseAttributes.enableTrace.foreach(builder.setTracing)

So I think we could simply declare the builder type like this:

trait DseCqlStatement extends DseStatement[StatementBuilder[_, Statement[_]]]

Then we don't need T and B anywhere else anymore. For example, CqlRequestAction can stay unparameterized and we would have:

private def buildStatement(builder: StatementBuilder[_, Statement[_]]): Statement[_] = {
   ... // body unchanged
}

We can define a type alias for StatementBuilder[_, Statement[_]] in a package object, to avoid repeating it everywhere.

package object model {
  type AnyStatementBuilder = StatementBuilder[_, Statement[_]]
}


StatementBuilder[_, Statement[_]]

Yikes, that doesn't work: "type arguments do not conform to class StatementBuilder's type parameter bounds". I forgot that Scala code does not get checked instantly in IDEA 😿
There might not be a way around those two type parameters after all.

@olim7t olim7t left a comment

This is taking me forever to review ☹️, partly because of my lack of experience with Gatling and Scala in general. A few questions / comments:

Is there a generic guide somewhere on how to write Gatling plugins? It gets hard to follow when you don't know exactly how the code interacts with the Gatling core.

I think a full review by @pingtimeout would be welcome, since this is the first big PR (as far as I know) since we've taken ownership of this project.

Is everything in this changeset strictly related to the driver upgrade? Some things (maybe this?) look like more generic improvements. Could we possibly delay them to follow-up PRs, in order to keep more manageable chunks for reviewers?

@absurdfarce
Collaborator Author

Mentioned to @olim7t in a separate conversation but adding here for completeness: I'm unaware of any general-purpose guides to Gatling and/or its plugins beyond the docs at gatling.io... and those aren't always super-helpful, especially about the more detailed stuff. @pingtimeout may have some better resources to pass along.

I'll certainly welcome whatever comments Pierre has to offer on this change. He's got a request out for more detailed documentation on the API changes in this PR that I still need to address but I'm a solid 👍 to any additional comments he'd like to add.

There certainly are some changes in this PR that are not strictly related to the driver upgrade, most notably the example cited by Olivier above. However, in the case of that change the simplification to the check API significantly reduced the amount of code that had to be changed, which had the net effect of actually reducing the effort required to complete the upgrade. The change also made the check API more flexible and would be a bit hard to decouple now (since the new API is in use in multiple Gatling simulations both in the tests of this project and in other work on dse-benchmarks) so for all those reasons I'd argue against removing it from this PR.

That's really the biggest example of a change not directly driven by the driver upgrade. There are a few other tweaks here and there but not much else of any real size.

* @param idempotent Set request to be idempotent i.e. whether it can be applied multiple times
* @param defaultTimestamp Set default timestamp on request, overriding current system time
* @param node Set the node that should handle this query
Contributor

Isn't that node parameter new? IIRC with the previous driver, it was not possible to directly choose which node would execute a query. I am not sure how it should be used. Does it replace the whitelist LBP?

Collaborator Author

I believe it's equivalent to setHost(Host) in the old API... just with a different type now
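Roughly, assuming a Node obtained from the session metadata, the mapping looks like this (a sketch, not plugin code):

```scala
import com.datastax.oss.driver.api.core.cql.SimpleStatement
import com.datastax.oss.driver.api.core.metadata.Node

// 3.x: statement.setHost(host)  -- Host came from the cluster metadata
// 4.x: the same intent, now expressed with a Node
def pinToNode(stmt: SimpleStatement, node: Node): SimpleStatement =
  stmt.setNode(node)
```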

* @param idempotent Set request to be idempotent i.e. whether it can be applied multiple times
* @param defaultTimestamp Set default timestamp on request, overriding current system time
* @param node Set the node that should handle this query
* @param customPayload Custom payload for this request
Contributor

Unless I am mistaken, this field is never used. The only place where custom payloads are used is when users write something like

val stmt = new SimpleStatement(...)
//...
cql("My request").executeCustomPayload(stmt, "sessionKey")

There is no code path that leads to writing to this DseCqlAttributes::customPayload field, so I suggest removing it entirely.

Collaborator Author

The parallel here is to addCustomPayload() in the old API. I think I just missed adding it in DseCqlAttributesBuilder, so my inclination is to just add it there in order to make sure the API is complete. Does that sound right @pingtimeout ?

extends DseCqlStatement {
case class DseCqlBoundStatementNamed(cqlTypes: CqlPreparedStatementUtil,
preparedStatement: PreparedStatement,
builderFn: (BoundStatement) => BoundStatementBuilder)
Contributor

I am not quite sure what the new builderFn parameter is for. Could you describe its utility?

Contributor

As far as I can tell, it seems that it is a way to avoid calling new BoundStatementBuilder(...) when running tests, because otherwise many getters would need to be mock'd. If so, then there might be a better form, following the same design.

Contributor

Here is an alternative design.

You could first create a companion object for DseCqlStatement that would contain the default builder as follows:

object DseCqlStatement {
  private[model] val defaultBuilder: BoundStatement => BoundStatementBuilder = new BoundStatementBuilder(_)
}

Then, import it so that you have a shortcut to reference it. And everywhere you used to request a default builder, you now add a default value for that parameter. For example, in DseCqlBoundStatementName, it looks like this:

import com.datastax.gatling.plugin.model.DseCqlStatement.defaultBuilder

// [...]

case class DseCqlBoundStatementNamed(cqlTypes: CqlPreparedStatementUtil,
                                     preparedStatement: PreparedStatement,
                                     builderFn: BoundStatement => BoundStatementBuilder = defaultBuilder)
  extends DseCqlStatement[BoundStatement, BoundStatementBuilder] {
// [...]
}

That should allow you to remove a lot of companion objects that you created to hold these default builders. The test code stays the same, in that it can override the builder function when needed.

The only gotcha will happen in DseCqlBoundStatementWithPassedParams. It currently uses a varargs of Expression[AnyRef], which cannot be used with default parameter values. So you will need to change it so that it becomes a Seq[Expression[AnyRef]]. With the optional builder parameter being last, it will look like this:

case class DseCqlBoundStatementWithPassedParams(cqlTypes: CqlPreparedStatementUtil,
                                                preparedStatement: PreparedStatement,
                                                params: Seq[Expression[AnyRef]],
                                                builderFn: BoundStatement => BoundStatementBuilder = defaultBuilder)
// [...]

Collaborator Author

Your intuition is entirely correct @pingtimeout . My original version of this code looked very much like your proposed solution but the issue with DseCqlBoundStatementWithPassedParams made me nervous; I didn't want to introduce yet more API changes as part of this PR. So I settled on the version implemented here, which is really just an extra constructor on the case class.

I still don't love the idea of converting the params in DseCqlBoundStatementWithPassedParams into a Seq... I kinda like the vararg behaviour you have now. Fortunately, I think I can also solve the problem (and remove all the companion objects) via implicit args. Take a look at the new version and let me know what you think!
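Roughly, the implicit-args version looks like this (a sketch only; the case class bodies and the exact wiring live in the actual change):

```scala
import com.datastax.oss.driver.api.core.cql.{BoundStatement, BoundStatementBuilder, PreparedStatement}

// Sketch: a single implicit default builder function replaces the per-class
// companion objects; tests can still supply their own implicit (or pass one
// explicitly) to avoid constructing a real BoundStatementBuilder.
package object model {
  implicit val defaultBuilderFn: BoundStatement => BoundStatementBuilder =
    new BoundStatementBuilder(_)
}

case class DseCqlBoundStatementNamed(cqlTypes: CqlPreparedStatementUtil,
                                     preparedStatement: PreparedStatement)
                                    (implicit builderFn: BoundStatement => BoundStatementBuilder)
  extends DseCqlStatement[BoundStatement, BoundStatementBuilder] {
  // [...]
}
```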

})
}

def getIterableClz[T <: Any](session:Session, name:String):Class[_ <: T] = {

Why are we now storing class information in the Gatling session?
master doesn't do that, it looks like it was able to use the setter that doesn't take a class argument:

<T> BoundStatement setSet(int i, Set<T> v)

That still exists in 4.x.

Collaborator Author

It does? SettableByName appears to just have this:

  @NonNull
  @CheckReturnValue
  default <ElementT> SelfT setSet(
      @NonNull String name, @Nullable Set<ElementT> v, @NonNull Class<ElementT> elementsClass) {
    return setSet(firstIndexOf(name), v, elementsClass);
  }

SettableById looks to be similar. Am I missing something?


My bad, I should have actually checked 🤦‍♂

That being said, we could easily reimplement it in the plugin code (Java but you get the idea):

  static <SettableT extends SettableByIndex<SettableT>> SettableT setSet(
      SettableT settable, Set<?> set, int index) {
    DataType targetCqlType = settable.getType(index);
    TypeCodec<Object> codec = settable.codecRegistry().codecFor(targetCqlType);
    ByteBuffer encoded = codec.encode(set, settable.protocolVersion());
    return settable.setBytesUnsafe(index, encoded);
  }

It's less type-safe (which is why it's not in the driver API anymore), but for the way this is going to be used in Gatling tests I think it's good enough. If I understand correctly, the burden of maintaining the "-clz" session parameters falls onto the Gatling user; that's a lot of extra ceremony for little benefit: type mismatches can only be detected at runtime anyway.

Collaborator Author

Ooooh, I like the idea here a great deal. The "clz" props in the session were added in the absence of a single source of truth for the type information in question but the SettableByName instance certainly could serve in that role. Very interesting indeed.

One thing I don't like, though, is the explicit encoding + byte-level set here... I'm worried about its effect on code maintainability since it's not quite as obvious what's going on. I'm wondering if there's some way to get the Java type from the codec and then still use the member-type aware setSet() method. You'd wind up doing a double lookup on the codec in that case which isn't... awesome, but maybe not that big a deal.

I want to chew on this idea just a bit more.

@olim7t olim7t Feb 19, 2020

You could replace the last two lines by:

return settable.set(index, set, codec.getJavaType());

But yeah, that will look up the codec a second time; so it's doing the exact same thing in a slightly less efficient way.

Collaborator Author

Had to tweak this just a bit, but I think I have something working now. codec.getJavaType() returns the type of the collection and we need the individual element type here so there's a bit of extra processing... but it looks like it's gonna work fairly nicely. I think it is a bit more readable/maintainable than the direct write option.

Collaborator Author

@olim7t when you have time would you mind taking a quick pass over the state of this last change? I'd like a non-me set of eyes to look at it and confirm it seems sensible

…the Gatling session from prepared statement support.

Olivier pointed out that the prepared statement itself can serve as a source of truth for type info here.
}

def clzFromCodec(bindable: Bindable[_], genType: DataType): Class[Any] = {
bindable.codecRegistry().codecFor(genType).getJavaType.getRawType.asInstanceOf[Class[Any]]
Collaborator Author

The return type and the asInstanceOf here are just to make the compiler happy. The raw class coming off of getRawType() is a superclass-bounded Class instance, but the set*() methods expect the Class type declared in their Java signatures. If we just throw the superclass-bounded version at a set*() method, scalac complains about a type mismatch... but they are equivalent types underneath. So we cast to Class[Any] here to satisfy the types at compile time.

dataType match {
case l: ListType => {
val memberClz = clzFromCodec(bindable, l.getElementType)
bindable.setList(key, asList(gatlingSession, paramName, memberClz), memberClz)

Have you tested this with nested collections? e.g. List<List<String>>

memberClz would be Class[List].asInstanceOf[Class[Any]], I'm not sure if that selects the right codec.

Also the code is a bit inefficient in that it looks up the codec twice: once in clzFromCodec, and once in setList.

Collaborator Author

The previous code already failed in the list-of-lists case:

diff --git a/src/test/scala/com/datastax/gatling/plugin/utils/CqlPreparedStatementUtilSpec.scala b/src/test/scala/com/datastax/gatling/plugin/utils/CqlPreparedStatementUtilSpec.scala
index d373c2c..118c431 100644
--- a/src/test/scala/com/datastax/gatling/plugin/utils/CqlPreparedStatementUtilSpec.scala
+++ b/src/test/scala/com/datastax/gatling/plugin/utils/CqlPreparedStatementUtilSpec.scala
@@ -52,6 +52,7 @@ class CqlPreparedStatementUtilSpec extends BaseCassandraServerSpec {
     "setJava" -> Set(1).asJava,
 
     "list" -> List(1),
+    "list-of-lists" -> List(List(1)),
     "listJava" -> List(1).asJava,
 
     "map" -> Map(1 -> 1),
@@ -966,6 +967,12 @@ class CqlPreparedStatementUtilSpec extends BaseCassandraServerSpec {
         result.isSet(15) shouldBe true
       }
 
+      it("should bind with a list of lists") {
+        val result = CqlPreparedStatementUtil.bindParamByOrder(defaultGatlingSession, boundStatementKeys, DataType.Name.LIST, "list-of-lists", 15)
+        result shouldBe a[BoundStatement]
+        result.isSet(15) shouldBe true
+      }
+
       it("should bind with a set") {
         val result = CqlPreparedStatementUtil.bindParamByOrder(defaultGatlingSession, boundStatementKeys, DataType.Name.SET, "set", 16)
         result shouldBe a[BoundStatement]
[info]   - should bind with a list of lists *** FAILED ***                      
[info]     com.datastax.driver.core.exceptions.InvalidTypeException: Invalid type for list<int> element, expecting java.lang.Integer but got class scala.collection.immutable.$colon$colon
[info]     at com.datastax.driver.core.TypeCodec$AbstractCollectionCodec.serialize(TypeCodec.java:1765)
[info]     at com.datastax.driver.core.TypeCodec$AbstractCollectionCodec.serialize(TypeCodec.java:1740)
[info]     at com.datastax.driver.core.AbstractData.setList(AbstractData.java:349)
[info]     at com.datastax.driver.core.BoundStatement.setList(BoundStatement.java:719)
...

This result (plus the idea that list of lists are probably not very likely in Gatling use cases) leads me to argue that while this probably could be improved it feels like something we'd want to address in a follow-up effort, not something that should be addressed as part of the driver upgrade.


Works for me, I agree it's not high priority.

case Some(p: Polygon) =>
boundStatement.set(paramName, asPolygon(gatlingSession, paramName), classOf[Polygon])
bindable.set(paramName, asPolygon(gatlingSession, paramName), classOf[Polygon])
case _ =>
throw new UnsupportedOperationException(s"$paramName on unknown CUSTOM type")

The user might have registered a custom codec for this non-standard type. I think we should try to look up the codec and manually serialize here (as in the example I included in one of my previous comments).

If the custom type is truly unknown, the codec registry will throw a CodecNotFoundException.

Collaborator Author
@absurdfarce absurdfarce Mar 16, 2020

I'm not sure how much we want to worry about custom geometry codecs for this use case, but if we do, the SettableBy* interfaces support passing an explicit codec, so can't we change this to the following and accomplish the same thing?

          case CUSTOM =>
            gatlingSession.attributes.get(paramName) match {
              case Some(p: Point) =>
                bindable.set(key, asPoint(gatlingSession, paramName), bindable.codecRegistry().codecFor(classOf[Point]))
              case Some(p: LineString) =>
                bindable.set(key, asLineString(gatlingSession, paramName), bindable.codecRegistry().codecFor(classOf[LineString]))
              case Some(p: Polygon) =>
                bindable.set(key, asPolygon(gatlingSession, paramName), bindable.codecRegistry().codecFor(classOf[Polygon]))
              case _ =>
                throw new UnsupportedOperationException(s"$paramName on unknown CUSTOM type")
            }


I wasn't thinking in terms of new codecs for existing custom types, but rather new custom types. E.g. someone has their own PriceType on the server, and a PriceCodec added to their session.

For the existing geo types, the code above is ok.

Collaborator Author

Ah, I see what you mean now. Sounds like another useful enhancement for the future. It might be just a little tricky, since the match is largely used as a destructuring operation here, which could make the types a bit complicated (if the user does something outside of the normal hierarchy), but that hardly seems intractable.
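For the record, a very rough sketch of what that future enhancement could look like (hypothetical helper, not part of this PR): let the registry resolve a codec from the runtime value rather than matching on the known geo types.

```scala
import com.datastax.oss.driver.api.core.data.Bindable

// Hypothetical fallback for arbitrary custom types: ask the codec registry
// for a codec based on the runtime value; a truly unknown type surfaces as a
// CodecNotFoundException, matching the behavior described above.
def setCustomValue[B <: Bindable[B]](bindable: B, paramName: String, value: AnyRef): B = {
  val codec = bindable.codecRegistry().codecFor(value)
  bindable.set(paramName, value, codec)
}
```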

@absurdfarce
Collaborator Author

There haven't been any additional issues raised with this in some time. In the interest of not having this linger around forever I'm going to merge it now.

@absurdfarce absurdfarce merged commit 17a1613 into datastax:master Mar 24, 2020